Confidence Evaluation for Combining Diverse Classifiers

Authors

  • Hongwei Hao
  • Cheng-Lin Liu
  • Hiroshi Sako
Abstract

For combining classifiers at the measurement level, the diverse outputs of the classifiers should be transformed into uniform measures that represent the confidence of the decision, ideally the class probability or likelihood. This paper presents our experimental results of classifier combination using confidence evaluation. We test three types of confidences: log-likelihood, exponential, and sigmoid. For re-scaling the classifier outputs, we use three scaling functions based on global normalization and Gaussian density estimation. Experimental results in handwritten digit recognition show that, via confidence evaluation, superior classification performance can be obtained using simple combination rules.
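The pipeline described in the abstract can be sketched as follows: re-scale each classifier's raw outputs (here via global normalization followed by a sigmoid, one of the three confidence types named above) and then combine the resulting confidence vectors with a simple rule such as the sum rule. The normalization parameters and sample scores below are hypothetical placeholders; in practice the mean and standard deviation would be estimated from validation data.

```python
import numpy as np

def sigmoid_confidence(outputs, mean, std):
    """Map raw classifier outputs to [0, 1] confidences.

    Global normalization followed by a sigmoid squashing.
    `mean` and `std` are assumed to have been estimated from
    a validation set (placeholder values are used below).
    """
    scaled = (outputs - mean) / std         # global normalization
    return 1.0 / (1.0 + np.exp(-scaled))    # sigmoid confidence

def combine_sum_rule(confidences_per_classifier):
    """Simple sum rule: average the confidence vectors, pick the argmax."""
    avg = np.mean(confidences_per_classifier, axis=0)
    return int(np.argmax(avg))

# Hypothetical raw scores of two classifiers for a 3-class problem
raw_a = np.array([2.1, -0.5, 0.3])
raw_b = np.array([1.8, 0.9, -1.2])
conf_a = sigmoid_confidence(raw_a, mean=0.0, std=1.0)
conf_b = sigmoid_confidence(raw_b, mean=0.0, std=1.0)
print(combine_sum_rule([conf_a, conf_b]))  # → 0 (highest average confidence)
```

Once all classifiers emit comparable confidences, trainable combiners become unnecessary for many tasks, which is why the paper reports good results from such simple rules.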

Related articles

Class-confidence critic combining

This paper discusses a combination of two techniques for improving the recognition accuracy of on-line handwritten character recognition: committee classification and adaptation to the user. A novel adaptive committee structure, namely the Class-Confidence Critic Combination (CCCC) scheme, is presented and evaluated. It is shown to be able to improve significantly on its member classifiers. Als...


On Adaptive Confidences for Critic-Driven Classifier Combining

When combining classifiers in order to improve the classification accuracy, precise estimation of the reliability of each member classifier can be very beneficial. One approach for estimating how confident we can be in the member classifiers’ results being correct is to use specialized critics to evaluate the classifiers’ performances. We introduce an adaptive, critic-based confidence evaluatio...


Exploration of classification confidence in ensemble learning

Ensemble learning has attracted considerable attention owing to its good generalization performance. The main issues in constructing a powerful ensemble include training a set of diverse and accurate base classifiers, and effectively combining them. Ensemble margin, computed as the difference between the number of votes received by the correct class and the number received by the most-voted other class, i...


Using Negative Correlation Learning to Improve the Performance of Combining Neural Networks

This paper investigates the effect of diversity caused by Negative Correlation Learning (NCL) in the combination of neural classifiers and presents an efficient way to improve combining performance. Decision Templates and Averaging, as two non-trainable combining methods, and Stacked Generalization, as a trainable combiner, are investigated in our experiments. Utilizing NCL for diversifying the ba...


Multilevel Data Classification and Function Approximation Using Hierarchical Neural Networks

Combining diverse features and multiple classifiers is an open research area in which no optimal strategy is found but successful experimental studies have been performed depending on a specific task at hand. In this chapter, a strategy for combining diverse features and multiple classifiers is presented as an exemplary new model in multilevel data classification using hierarchical neural netwo...



Journal title:

Volume   Issue 

Pages  -

Publication date: 2003